ensure robots.txt via test #650
@nearestnabors bump!
nearestnabors left a comment
With a request
const robotsPath = join(process.cwd(), "public", "robots.txt");
const robotsContent = readFileSync(robotsPath, "utf-8");
expect(robotsContent).toContain(
@evantahler Shouldn't it also reference llms.txt? Since, technically, agents are bots?
Cursor Bugbot has reviewed your changes and found 1 potential issue.
Note
Enhances robots and sitemap validation.
- public/robots.txt: adds an AI Agent Resources comment linking https://docs.arcade.dev/llms.txt (the sitemap line is retained)
- tests/sitemap.test.ts: asserts robots.txt contains the sitemap URL and the llms.txt reference, alongside the existing sitemap URL and duplication checks

Written by Cursor Bugbot for commit 44a4d42.
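Based on that summary, the resulting public/robots.txt would look something like the sketch below. Only the AI Agent Resources comment and the presence of a sitemap line come from the change itself; the User-agent/Allow directives and the sitemap URL are placeholder assumptions:

```
# AI Agent Resources: https://docs.arcade.dev/llms.txt
User-agent: *
Allow: /
Sitemap: https://example.com/sitemap.xml
```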